Algorithm 1026: Concurrent Alternating Least Squares for Multiple Simultaneous Canonical Polyadic Decompositions


Abstract

Tensor decompositions, such as CANDECOMP/PARAFAC (CP), are widely used in a variety of applications, such as chemometrics, signal processing, and machine learning. A broadly used method for computing such decompositions relies on the Alternating Least Squares (ALS) algorithm. When the number of components is small, regardless of its implementation, ALS exhibits low arithmetic intensity, which severely hinders its performance and makes GPU offloading ineffective. We observe that, in practice, experts often have to compute multiple decompositions of the same tensor, each with a small number of components (typically fewer than 20), to ultimately find the best ones to use for the application at hand. In this paper, we illustrate how multiple decompositions of the same tensor can be fused together at the algorithmic level to increase the arithmetic intensity. Therefore, it becomes possible to make efficient use of GPUs for further speedups; at the same time, the technique is compatible with many of the enhancements typically used in ALS, such as line search, extrapolation, and non-negativity constraints. We introduce the Concurrent ALS algorithm and library, which offers an interface to Matlab, and a mechanism to effectively deal with the issue that the different decompositions complete at different times. Experimental results on artificial and real datasets demonstrate shorter time to completion due to the increased arithmetic intensity.
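The building block the abstract refers to, a plain single-tensor CP-ALS sweep, can be sketched in a few lines of NumPy. This is an illustrative textbook version (the MTTKRP against the unfolded tensor, followed by a small normal-equations solve per mode), not the paper's fused concurrent implementation; all function names here are our own.

```python
import numpy as np
from functools import reduce

def khatri_rao(U, V):
    """Column-wise Kronecker product of U (I x R) and V (J x R) -> (I*J) x R."""
    I, R = U.shape
    J, _ = V.shape
    return (U[:, None, :] * V[None, :, :]).reshape(I * J, R)

def unfold(X, mode):
    """Mode-n unfolding: X.shape[mode] rows, remaining axes flattened in order."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def cp_als(X, rank, n_iter=100, seed=0):
    """Basic ALS for a CP decomposition of an order-N tensor X.

    Returns a list of factor matrices, one of shape (X.shape[n], rank) per mode.
    """
    rng = np.random.default_rng(seed)
    factors = [rng.standard_normal((s, rank)) for s in X.shape]
    for _ in range(n_iter):
        for mode in range(X.ndim):
            others = [factors[m] for m in range(X.ndim) if m != mode]
            # MTTKRP: unfolded tensor times the Khatri-Rao of the other factors.
            kr = reduce(khatri_rao, others)
            # Gram matrix of kr computed cheaply as a Hadamard product of
            # small R x R Grams (the classic ALS trick).
            gram = np.ones((rank, rank))
            for U in others:
                gram *= U.T @ U
            factors[mode] = unfold(X, mode) @ kr @ np.linalg.pinv(gram)
    return factors
```

With few components, each MTTKRP is a tall, skinny matrix product, which is why arithmetic intensity is low; the fusion the abstract describes amounts to sharing the large unfolded-tensor operand across several concurrent decompositions, turning many skinny products into fewer, wider ones.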


Similar Articles

Randomized Alternating Least Squares for Canonical Tensor Decompositions: Application to A PDE With Random Data

This paper introduces a randomized variation of the alternating least squares (ALS) algorithm for rank reduction of canonical tensor formats. The aim is to address the potential numerical ill-conditioning of least squares matrices at each ALS iteration. The proposed algorithm, dubbed randomized ALS, mitigates large condition numbers via projections onto random tensors, a technique inspired by w...


Tensor Decompositions, Alternating Least Squares and other Tales

This work was originally motivated by a classification of tensors proposed by Richard Harshman. In particular, we focus on simple and multiple “bottlenecks”, and on “swamps”. Existing theoretical results are surveyed, some numerical algorithms are described in detail, and their numerical complexity is calculated. In particular, the interest in using the ELS enhancement in these algorithms is d...


Local Convergence of the Alternating Least Squares Algorithm for Canonical Tensor Approximation

A local convergence theorem for calculating canonical low-rank tensor approximations (PARAFAC, CANDECOMP) by the alternating least squares algorithm is established. The main assumption is that the Hessian matrix of the problem is positive definite modulo the scaling indeterminacy. A discussion of whether this is realistic, and numerical illustrations, are included. Regularization is also addressed.


Multiple Concurrent Recursive Least Squares Identification

A new algorithm, multiple concurrent recursive least squares (MCRLS), is developed for parameter estimation in a system whose governing equations cannot be manipulated into a form allowing (direct) linear regression of the unknown parameters. In this algorithm, the single nonlinear problem is segmented into two or more separate linear problems, thereby enab...


An Alternating Least Squares Algorithm with Application to Image Processing

Least Squares (LS) estimation is a classical problem, often arising in practice. When the dimension of the problem is large, the solution may be difficult to obtain for complexity reasons. A general way to reduce the complexity is to break the problem into smaller sub-problems. Following this approach, in the paper we introduce an Alternating Least Squares (ALS) algorithm that finds t...



Journal

Journal: ACM Transactions on Mathematical Software

Year: 2022

ISSN: 0098-3500, 1557-7295

DOI: https://doi.org/10.1145/3519383